
Inside the Dirty, Dystopian World of AI Data Centers

The Atlantic - Technology

This story appears in the April 2026 print edition. The race to power AI is already remaking the physical world. Three Mile Island's cooling towers have until recently served as grave markers for America's nuclear-power industry. As we drove through southwest Memphis, KeShaun Pearson told me to keep my window down--our destination was best tasted, not viewed. Along the way, we passed an abandoned coal plant to our right, then an active power plant to our left, equipped with enormous natural-gas turbines. Pearson, who directs the nonprofit Memphis Community Against Pollution, was bringing me to his hometown's latest industrial megaproject.


Japan eyes distant island for nuclear waste dump

Popular Science

Minamitorishima is nearly 1,250 miles east of Tokyo. The island is surrounded by a coral atoll and is only 0.6 miles wide. Nuclear power is on the rise around the world, but with it comes an extremely pressing question: where will all of the radioactive waste be stored? For Japan, one answer may lie in literally the most remote location at its disposal.


Learning from Complexity: Exploring Dynamic Sample Pruning of Spatio-Temporal Training

Chen, Wei, Chen, Junle, Wu, Yuqian, Liang, Yuxuan, Zhou, Xiaofang

arXiv.org Machine Learning

Spatio-temporal forecasting is fundamental to intelligent systems in transportation, climate science, and urban planning. However, training deep learning models on the massive, often redundant, datasets from these domains presents a significant computational bottleneck. Existing solutions typically focus on optimizing model architectures or optimizers, while overlooking the inherent inefficiency of the training data itself. This conventional approach of iterating over the entire static dataset each epoch wastes considerable resources on easy-to-learn or repetitive samples. In this paper, we explore a novel training-efficiency technique for spatio-temporal forecasting, ST-Prune, which learns from complexity via dynamic sample pruning. Through dynamic sample pruning, we aim to intelligently identify the most informative samples based on the model's real-time learning state, thereby accelerating convergence and improving training efficiency. Extensive experiments conducted on real-world spatio-temporal datasets show that ST-Prune significantly accelerates training while maintaining or even improving model performance, and that it generalizes across models and datasets, demonstrating scalability and universality.
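The abstract describes dynamic sample pruning only in general terms. A minimal sketch of the underlying idea, keeping only the highest-loss samples for each epoch's update, is shown below on a toy linear-regression task; the function name, the `keep_frac` parameter, and the hardest-sample selection rule are illustrative assumptions, not ST-Prune's actual criterion.

```python
import numpy as np

def dynamic_sample_pruning(X, y, keep_frac=0.5, epochs=50, lr=0.1, seed=0):
    """Train a linear model, updating each epoch only on the highest-loss
    ("most informative") samples under the current weights.

    A loose sketch of loss-based dynamic pruning; the real ST-Prune
    criterion and schedule may differ."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    for _ in range(epochs):
        # Per-sample squared error reflects the model's current learning state.
        losses = (X @ w - y) ** 2
        k = max(1, int(keep_frac * len(X)))
        keep = np.argsort(losses)[-k:]        # indices of the k hardest samples
        Xk, yk = X[keep], y[keep]
        grad = 2.0 * Xk.T @ (Xk @ w - yk) / k  # MSE gradient on the pruned subset
        w -= lr * grad
    return w

# Toy data: y = 2*x0 - x1 plus small noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0]) + 0.01 * rng.normal(size=200)
w = dynamic_sample_pruning(X, y)
```

Each epoch touches only half the data, yet the fit still recovers the true coefficients, which is the efficiency argument the paper makes at scale.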


Large Language Models as Urban Residents: An LLM Agent Framework for Personal Mobility Generation

Neural Information Processing Systems

This paper introduces a novel approach using Large Language Models (LLMs) integrated into an agent framework for flexible and effective personal mobility generation. LLMs overcome the limitations of previous models by effectively processing semantic data and offering versatility in modeling various tasks.